# Japanese Text Generation

## Slim Orpheus 3B JAPANESE FT Q8_0 GGUF
- **License:** Apache-2.0
- **Author:** Gapeleon
- **Downloads:** 26 · **Likes:** 0
- **Tags:** Large Language Model, Japanese

A GGUF-format conversion of the slim-orpheus-3b-JAPANESE-ft model, optimized for Japanese text processing.
## DiffLlama-1B
- **License:** Apache-2.0
- **Author:** kajuma
- **Downloads:** 202 · **Likes:** 2
- **Tags:** Large Language Model, Japanese

DiffLlama-1B is a roughly 1-billion-parameter language model pre-trained from scratch on approximately 100 billion tokens, adopting the Differential Transformer architecture.
## Gemma 2 2B JPN IT
- **Author:** google
- **Downloads:** 7,510 · **Likes:** 183
- **Tags:** Large Language Model, Transformers, Japanese

Gemma 2 JPN is a version of the Gemma 2 2B model fine-tuned on Japanese text; it excels at Japanese language processing and suits a range of text generation tasks.
## Japanese GPT-NeoX 3.6B
- **License:** MIT
- **Author:** rinna
- **Downloads:** 34.74k · **Likes:** 99
- **Tags:** Large Language Model, Transformers, Supports Multiple Languages

A 3.6-billion-parameter Japanese GPT-NeoX model based on the Transformer architecture, trained on 312.5 billion tokens of Japanese corpus data.
## Japanese GPT-NeoX Small
- **License:** MIT
- **Author:** rinna
- **Downloads:** 838 · **Likes:** 15
- **Tags:** Large Language Model, Transformers, Supports Multiple Languages

A small Japanese language model based on the GPT-NeoX architecture, supporting text generation tasks.
## GPT-2 Large Japanese
- **License:** MIT
- **Author:** abeja
- **Downloads:** 960 · **Likes:** 18
- **Tags:** Large Language Model, Transformers, Supports Multiple Languages

A large Japanese GPT-2 model trained by ABEJA, supporting Japanese text generation tasks.
## Japanese GPT-2 Medium
- **License:** MIT
- **Author:** rinna
- **Downloads:** 7,664 · **Likes:** 79
- **Tags:** Large Language Model, Supports Multiple Languages

A medium-scale Japanese GPT-2 model trained by rinna Co., Ltd., based on the Transformer architecture and suitable for Japanese text generation tasks.